Multiple Boosting: A Combination of Boosting and Bagging
Author

Abstract
Classifier committee learning approaches have demonstrated great success in increasing the prediction accuracy of classifier learning, which is a key technique for data mining. It has been shown that Boosting and Bagging, as two representative methods of this type, can significantly decrease the error rate of decision tree learning. Boosting is generally more accurate than Bagging, but the former is more variable than the latter. In addition, Bagging is amenable to parallel or distributed processing, while Boosting is not. In this paper, we study a new committee learning algorithm, namely MB (Multiple Boosting). It creates multiple subcommittees by combining Boosting and Bagging. Experimental results in a representative collection of natural domains show that MB is, on average, more accurate than either Bagging or Boosting alone. It is more stable than Boosting, and is amenable to parallel or distributed processing. These characteristics make MB a good choice for parallel data mining.
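The combination the abstract describes can be sketched as bagging over boosted subcommittees: each subcommittee is trained by boosting on its own bootstrap sample, and the subcommittees vote. The sketch below is illustrative, not the paper's exact algorithm; it assumes scikit-learn's `AdaBoostClassifier` (with its default tree stumps) in place of the C4.5-style decision trees the original work used, and the function names `mb_fit`/`mb_predict` are made up for this example.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier


def mb_fit(X, y, n_subcommittees=5, boost_rounds=10, seed=0):
    """Sketch of MB: bagging over boosted subcommittees."""
    rng = np.random.default_rng(seed)
    n = len(y)
    subcommittees = []
    for _ in range(n_subcommittees):
        # Bagging step: draw a bootstrap sample of the training set.
        idx = rng.integers(0, n, size=n)
        # Boosting step: build a boosted subcommittee on that sample.
        clf = AdaBoostClassifier(n_estimators=boost_rounds, random_state=0)
        clf.fit(X[idx], y[idx])
        subcommittees.append(clf)
    return subcommittees


def mb_predict(subcommittees, X):
    # Each subcommittee votes; the majority class wins.
    votes = np.stack([clf.predict(X) for clf in subcommittees])
    return np.array([np.bincount(col).argmax() for col in votes.T.astype(int)])
```

Because each subcommittee is built independently on its own bootstrap sample, the outer loop of `mb_fit` can be run in parallel, which is the property the abstract highlights for parallel data mining.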
Related Papers
Experiments with Two New Boosting Algorithms
Boosting is an effective classifier combination method that can improve the classification performance of an unstable learning algorithm, but it offers little improvement for a stable one. In this paper, multiple TAN classifiers are combined by a combination method called Boosting-MultiTAN, which is compared with the Boosting-BAN classifier, which is boosting based on BAN com...
Multiple Boosting: A Combination of Boosting and Bagging
Classifier committee learning approaches have demonstrated great success in increasing the prediction accuracy of classifier learning, which is a key technique for data mining. These approaches generate several classifiers to form a committee by repeated application of a single base learning algorithm. The committee members vote to decide the final classification. It has been shown that Boosting and ...
Outlier Detection by Boosting Regression Trees
A procedure for detecting outliers in regression problems is proposed. It is based on information provided by boosting regression trees. The key idea is to select the most frequently resampled observation along the boosting iterations and reiterate after removing it. The selection criterion is based on Tchebychev’s inequality applied to the maximum over the boosting iterations of ...
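The key idea in this snippet, selecting the most frequently resampled observation across boosting iterations, can be sketched as follows. This is a minimal illustration, not the paper's procedure: the weight-update rule, tree depth, and function names are assumptions made for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def resample_counts(X, y, n_iter=25, seed=0):
    """Boost regression trees by weighted resampling and count
    how often each observation is drawn (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)
    counts = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # Resample the training set according to the current weights.
        idx = rng.choice(n, size=n, replace=True, p=w)
        counts += np.bincount(idx, minlength=n)
        tree = DecisionTreeRegressor(max_depth=2).fit(X[idx], y[idx])
        resid = np.abs(y - tree.predict(X))
        # Illustrative weight update: hard-to-fit points get resampled more.
        w = resid / resid.sum() if resid.sum() > 0 else np.full(n, 1.0 / n)
    return counts


def flag_outlier(X, y, **kw):
    # The candidate outlier is the most frequently resampled observation.
    return int(np.argmax(resample_counts(X, y, **kw)))
```

In the procedure the abstract describes, the flagged observation would then be removed and the whole process repeated to find further outliers.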
Stochastic Attribute Selection Committees with Multiple Boosting: Learning More
Classifier learning is a key technique for KDD. Approaches to learning classifier committees, including Boosting, Bagging, Sasc, and SascB, have demonstrated great success in increasing the prediction accuracy of decision trees. Boosting and Bagging create different classifiers by modifying the distribution of the training set. Sasc adopts a different method. It generates committees by stochastic ma...
Using Bagging and Boosting Techniques for Improving Coreference Resolution
Classifier combination techniques have been applied to a number of natural language processing problems. This paper explores the use of bagging and boosting as combination approaches for coreference resolution. To the best of our knowledge, this is the first effort that examines and evaluates the applicability of such techniques to coreference resolution. In particular, we (1) outline a scheme ...